On improving robustness of LDA and SRDA by using tangent vectors
Abstract
In the area of pattern recognition, it is common for only a few training samples to be available relative to the dimensionality of the representation space; this is known as the curse of dimensionality. The problem can be alleviated by dimensionality reduction. Moreover, supervised dimensionality reduction techniques generally give better recognition performance; however, several of them suffer from the curse themselves when applied directly to high-dimensional spaces. We propose to overcome this problem by incorporating additional information into supervised subspace learning techniques using what are known as tangent vectors. This additional information accounts for the possible variations that the sample data can undergo. In fact, it can be seen as a way to model unseen data and to make better use of the scarce training samples. In this paper, methods for incorporating tangent vector information are described for one classical technique (LDA) and one state-of-the-art technique (SRDA). Experimental results confirm that this additional information improves performance and robustness to the known transformations.
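The exact formulations in the paper are not reproduced here. As a rough, minimal sketch of the general idea only (assuming vectorized image data, tangent vectors approximated by finite differences of one-pixel translations, and an LDA variant in which the within-class scatter is augmented with a weighted sum of tangent outer products; the names tangent_shift, tangent_lda and the weight gamma are illustrative, not taken from the paper):

import numpy as np
from scipy.ndimage import shift

def tangent_shift(img, axis):
    # Central finite difference approximating the tangent vector of a
    # one-pixel translation along the given axis (other transformations,
    # e.g. rotation or scaling, could be approximated in the same way).
    offset = [0, 0]
    offset[axis] = 1
    forward = shift(img, offset, order=1)
    backward = shift(img, [-o for o in offset], order=1)
    return (forward - backward).ravel() / 2.0

def tangent_lda(X, y, tangents, gamma=0.1, n_components=2):
    # Illustrative LDA variant: the within-class scatter is augmented with
    # the outer products of the tangent vectors, so directions along which
    # the known transformations act count as within-class variability.
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    Sw += gamma * tangents.T @ tangents      # tangent-vector augmentation
    # Leading eigenvectors of Sw^{-1} Sb (small ridge for numerical stability).
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs[:, order[:n_components]].real

In such a sketch, tangents would stack the tangent vectors of all training samples, and gamma would control how strongly the modelled transformations are treated as within-class variability.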
Similar resources
Improving Chernoff criterion for classification by using the filled function
Linear discriminant analysis is a well-known matrix-based dimensionality reduction method. It is a supervised feature extraction method used in two-class classification problems. However, it is incapable of dealing with data in which the classes have unequal covariance matrices. To address this issue, the Chernoff distance is an appropriate criterion for measuring the distance between distributions. In the p...
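For reference, the Chernoff distance between two Gaussian class-conditional densities has a standard closed form; this is given here for context and is not necessarily the exact criterion used in the cited paper. With alpha in (0,1) (alpha = 1/2 yields the Bhattacharyya distance):

D_C(\alpha) = -\ln \int p_1(x)^{1-\alpha} p_2(x)^{\alpha}\,dx
            = \frac{\alpha(1-\alpha)}{2}\,(\mu_2-\mu_1)^{\top}\big[\alpha\Sigma_1+(1-\alpha)\Sigma_2\big]^{-1}(\mu_2-\mu_1)
            + \frac{1}{2}\,\ln\frac{\big|\alpha\Sigma_1+(1-\alpha)\Sigma_2\big|}{|\Sigma_1|^{\alpha}\,|\Sigma_2|^{1-\alpha}}

Unlike the Fisher criterion, this quantity remains discriminative when the two classes share a mean but differ in covariance.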
Recognition of handwritten digit with transformed invariant distance - Project Final Report - Due 12 / 14 / 07 Buyoung
This paper presents a method to classify handwritten digits based on tangent vectors, which are the linear derivatives of transformations. The purpose of the tangent vector is to find the distance between manifolds, as a substitute for the classical Euclidean distance. Using the tangent vector, a satisfying performance was achieved that is invariant to the transformations. While implementing the classifier, we impr...
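The abstract is truncated above; purely as an illustrative sketch (not the report's own code), a one-sided tangent distance replaces the Euclidean distance by the distance from a sample to the linear approximation of a prototype's transformation manifold. The name tangent_distance and the least-squares formulation are assumptions:

import numpy as np

def tangent_distance(x, prototype, tangents):
    # One-sided tangent distance: minimise ||x - (prototype + T a)|| over the
    # coefficients a, where the columns of T are the prototype's tangent vectors.
    T = np.asarray(tangents).T
    a, *_ = np.linalg.lstsq(T, x - prototype, rcond=None)
    return np.linalg.norm(x - prototype - T @ a)

A nearest-neighbour digit classifier would then use this quantity in place of np.linalg.norm(x - prototype).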
On the k-nullity foliations in Finsler geometry
Here, a Finsler manifold $(M,F)$ is considered with its corresponding curvature tensor, regarded as $2$-forms on the bundle of non-zero tangent vectors. Certain subspaces of the tangent spaces of $M$ determined by the curvature are introduced and called $k$-nullity foliations of the curvature operator. It is shown that if the dimension of the foliation is constant, then the distribution is involutive...
Characterizations of Slant Ruled Surfaces in the Euclidean 3-space
In this study, we give the relationships between the conical curvatures of ruled surfaces generated by the unit vectors of the ruling, central normal and central tangent of a ruled surface in the Euclidean 3-space E^3. We obtain differential equations characterizing slant ruled surfaces, and if the reference ruled surface is a slant ruled surface, we give the conditions for the surfaces generate...
Improved Pseudoinverse Linear Discriminant Analysis Method for Dimensionality Reduction
Dimensionality reduction is an important aspect of pattern classification. It helps in improving the robustness (or generalization capability) of the pattern classifier and in reducing its computational complexity. The linear discriminant analysis (LDA) method is a well-known dimensionality reduction technique studied in the literature. The LDA technique finds an orientation matrix W that transform...
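The abstract is cut off above; as a minimal sketch of the pseudoinverse idea it refers to (variable names are illustrative, and the scatter matrices are the usual ones rather than the improved variant the paper proposes), the orientation matrix W can be taken from the leading eigenvectors of pinv(Sw) @ Sb, which remains defined even when Sw is singular in small-sample settings:

import numpy as np

def pseudoinverse_lda(X, y, n_components):
    # Pseudoinverse LDA sketch: use the Moore-Penrose pseudoinverse of the
    # within-class scatter Sw, so the criterion stays defined when Sw is singular.
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    return evecs[:, order[:n_components]].real   # columns span the reduced subspace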
Journal: Pattern Recognition Letters
Volume 34, Issue -
Pages -
Publication date: 2013